Results 1 - 2 of 2
1.
Management Science; 68(4): 2860-2868, 2022.
Article in English | APA PsycInfo | ID: covidwho-2272996

ABSTRACT

Misinformation has emerged as a major societal challenge in the wake of the 2016 U.S. elections, Brexit, and the COVID-19 pandemic. One of the most active areas of inquiry into misinformation examines how people's cognitive sophistication affects how likely they are to fall for misleading content. In this paper, we capture sophistication by studying how misinformation affects the two canonical models of the social learning literature: sophisticated (Bayesian) and naive (DeGroot) learning. We show that sophisticated agents can be more likely to fall for misinformation. Our model helps explain several experimental and empirical findings from cognitive science, psychology, and the social sciences. It also shows that the intuitions developed in a vast social learning literature should be approached with caution when making policy decisions in the presence of misinformation. We conclude by discussing the relationship between misinformation and increased partisanship, and we provide an example of how our model can inform the actions of policymakers trying to contain the spread of misinformation. (PsycInfo Database Record (c) 2022 APA, all rights reserved)
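The abstract names, but does not spell out, the two canonical social-learning models it builds on. The sketch below is only a textbook illustration of those two updating rules, naive (DeGroot) averaging and a single Bayesian posterior update; the trust matrix, initial beliefs, and signal accuracy are invented placeholders and are not taken from the paper's own model.

import numpy as np

# Illustrative only: textbook DeGroot and Bayesian updating rules.
# All numbers below are made-up placeholders, not the paper's parameters.

def degroot_step(beliefs, trust):
    """Naive learning: each agent's new belief is a trust-weighted
    average of its neighbors' current beliefs (trust is row-stochastic)."""
    return trust @ beliefs

def bayes_update(prior, p_signal_if_true, p_signal_if_false):
    """Sophisticated learning: Bayes' rule for P(state is true)
    after observing one signal."""
    num = prior * p_signal_if_true
    return num / (num + (1 - prior) * p_signal_if_false)

# DeGroot: three agents, the third initially holding a misinformed belief.
beliefs = np.array([0.9, 0.8, 0.1])          # each agent's P(state is true)
trust = np.array([[0.6, 0.3, 0.1],
                  [0.3, 0.4, 0.3],
                  [0.1, 0.3, 0.6]])          # rows sum to 1
for _ in range(50):
    beliefs = degroot_step(beliefs, trust)
print("DeGroot consensus:", beliefs)         # agents converge to a common belief

# Bayesian: one agent updates on a signal that is 80% accurate.
posterior = bayes_update(prior=0.5, p_signal_if_true=0.8, p_signal_if_false=0.2)
print("Bayesian posterior:", posterior)      # 0.8

In the DeGroot dynamic the limit belief is a fixed weighted average of the initial opinions, so a single misinformed agent shifts everyone's consensus, whereas the Bayesian update weighs each signal by how informative it is; the abstract's point is that, once misinformation is present, the sophisticated (Bayesian) agents can nonetheless end up more likely to fall for it.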

2.
Management Science; 68(4): 2860-2868, 2022.
Article in English | ProQuest Central | ID: covidwho-1833466

